Technology Tales

Adventures & experiences in contemporary technology

Is Vista licensing too restrictive?

15th February 2007

There are things in the Vista EULA that gave me the heebie-jeebies when I first saw them. In fact, one provision set off something of a storm across the web in the latter part of 2006. Microsoft in its wisdom went and made everything more explicit and raised Cain in doing so. It was their clarification of the one machine, one licence understanding that was at the heart of the whole furore. The new wording made it crystal clear that you were allowed to move your licence between machines once and once only. After howls of protest, the XP wording reappeared and things calmed down again.

Around the same time, Paul Thurrott published his take on the Vista EULA on his Windows SuperSite. He takes the view that the new EULA only clarified what was in the XP one and that enthusiast PC builders are but a small proportion of the software market. Another interesting point that he makes is that there is no need to license the home user editions of Vista for use in virtual machines because those users would not be doing that kind of thing. The logical conclusion of this argument is that only technical business users and enthusiasts would ever want to do such a thing; I am both. On the same site, Koroush Ghazi of TweakGuides.com offers an alternative view, at Thurrott’s invitation, from the enthusiast’s side. That view takes note of the restrictions of both the licensing and all of the DRM technology that Microsoft has piled into Vista. Another point made is that enthusiasts add a lot to the coffers of both hardware and software producers.

Bit-tech.net got the Microsoft view on the number of activations possible with a copy of retail Vista before further action is required. The number comes in at 10 and that seems a little low. However, Vista will differ from XP in that it thankfully will not need reactivation as often. In fact, it will take changing a hard drive and one other component to trigger it. That’s less stringent than needing reactivation after changing three components from a wider list within a set period, as is the case with XP. I cannot remember the exact duration of the period in question but 60 days seems to ring a bell.

OEM Vista is more restrictive than this: one reactivation and no more. I learned that from the current issue of PC Plus, the trigger of my concern regarding Windows licensing. Nevertheless, so long as no hard drive changes go on, you should be fine. That said, I do wonder what happens if you add or remove an external hard drive. On this basis at least, it seems that OEM is not such a bargain after all, and Microsoft will not support you anyway.

However, there are cracks appearing in the whole licensing edifice and the whole thing is beginning to look a bit of a mess. Brian Livingston of Windows Secrets has pointed out that you could do a clean install using only the upgrade edition(s) of Vista by installing it twice. The Vista upgrade will upgrade over itself, allowing you access to the activation process. Of course, he recommends that you only do this when you are already in possession of an XP licence and it does mean that your XP licence isn’t put out of its misery, apparently a surprising consequence of the upgrade process if I have understood it correctly.

However, this is not all. Jeff Atwood has shared on his blog Coding Horror that the 30-day activation grace period can be extended in three increments to 120 days. Another revelation was that all Windows editions are on the DVD and it is only the licence key in your possession that determines the version that you install. In fact, you can install any version for 30 days without entering a licence key at all. Therefore, you can experience the 32-bit or 64-bit versions and any edition from Home Basic, Home Premium, Business or Ultimate. The only catch is that once the grace period is up, you have to license the version that is installed at that point in time.

None of the above requires any cracking (a quick Google search digs up loads of references to cracking of the Windows activation process). It sounds surprising but it is none other than Microsoft itself that has made these possibilities available, albeit in an undocumented fashion. And the reason is not commercial benevolence but the need to keep their technical support costs under control, apparently.

That said, an unintended consequence of the activation period extensibility is that PC hardware enthusiasts, the types who rebuild their machines every few months (in contrast, I regard my main PC as a workhorse and I have no wish to cause undue disruption to my life with this sort of behaviour, but each to their own… anyway, it’s not as if they are doing anyone else any harm), would never have to activate their copies of Vista, thus avoiding any issues with the 1 or 10 activation limits: an interesting workaround for the limitations in the first place. And all of this is available without (illegally, no doubt) using a fake Windows activation server as has been reported.

With all of these back doors inserted into the activation process by Microsoft itself, it makes some of the scarier provisions look not only over the top but also plain silly: a bit like using a sledgehammer to crack a nut. For instance, there is a provision that Microsoft could kill your Windows licence if it deems that you have breached the terms of that licence. It looks as if it’s meant to cover the loss of functionality at the end of the activation grace period but it does rather give the appearance that your £370 Vista Ultimate is as ephemeral as a puff of smoke: overdoing that reminder is an almost guaranteed method of encouraging power users to jump ship to Linux or another UNIX. And the idea of Windows Genuine Advantage continually phoning home doesn’t provide any great reassurance either. However, it does seem that Microsoft has reactivated XP licences over the phone when reasonable grounds are given: irredeemable loss of a system, for example. That theme of the ease and cost of technical support returns again. There is a corollary to this: make life easy for Microsoft and they won’t bother you very much, if at all. Incidentally, if they ever did do a remote kill of your system, the whole action would be akin to skating on thin legal ice. And I suspect that they may not like making trouble for themselves.

I think I’ll let the dust settle and stay on my XP planet while in a Vista universe. As it happens, Paul Thurrott has a good article on that subject too.

Resolving Windows Update Error 0x80244019 on Windows 10

21st August 2015

In Windows 10, the preferred place to look if you fancy prompting an update of the system is in the Update & Security section of the Settings application. At the top is the Windows Update section and the process usually is as simple as pressing the Check for updates button. For most of the time, that has been my experience but it stopped working on my main Windows 10 virtual machine, so I needed to resolve the problem.

Initially, going into the Advanced Options section and deselecting the tick box for Give me updates for other Microsoft products when I update Windows helped, but it seemed a non-ideal solution so I looked further. It was then that I found that manually resetting a system’s Windows Update components had helped others, so I tried that and restarted the system.

The first part of the process was to right-click on the Start Menu button and select the Windows PowerShell (Admin) entry from the menu that appeared. This may be replaced by Command Prompt (Admin) on your machine, but the next steps in the process are the same. In fact, you could include any commands you see below in a script file and execute that if you prefer. Here, I will run through each group in succession.

From either PowerShell or the Command Prompt, you need to stop the Windows Update, Cryptographic, BITS (or Background Intelligent Transfer Service) and MSI Installer services. To accomplish this, execute the following commands at a command prompt:

net stop wuauserv
net stop cryptSvc
net stop bits
net stop msiserver

With the services stopped, it is then possible to rename the SoftwareDistribution and Catroot2 folders so that Windows can rebuild them afresh. To accomplish this, execute the following pair of commands using either PowerShell or the Command Prompt:

ren C:\Windows\SoftwareDistribution SoftwareDistribution.old
ren C:\Windows\System32\catroot2 Catroot2.old

Once you have the folders renamed, you can start the Windows Update, Cryptographic, BITS and MSI Installer services by executing the following commands in either PowerShell or the Command Prompt:

net start wuauserv
net start cryptSvc
net start bits
net start msiserver

Once these have completed, you may close the PowerShell or Command Prompt window that you were using and restart the machine. Going into the Update & Security section of the Settings tool afterwards and pressing the Check for updates button now builds new versions of the folders that you renamed and this takes a little while longer than the usual update process. Otherwise, you could let your system rebuild things in its own time. As it happens, I opted for manual intervention and all has worked well since then.
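As mentioned above, the whole sequence can live in a script file. Below is a rough Python sketch of that idea rather than a supported tool: the dry_run flag and the simple log are my own additions, and doing anything real would need an elevated prompt on an actual Windows system.

```python
# A sketch of the Windows Update reset sequence as one script; with
# dry_run=True (the default) it only builds a log of the steps instead
# of executing them, which would require an elevated Windows prompt.
import subprocess

RESET_STEPS = [
    ["net", "stop", "wuauserv"],
    ["net", "stop", "cryptSvc"],
    ["net", "stop", "bits"],
    ["net", "stop", "msiserver"],
    ["cmd", "/c", "ren", r"C:\Windows\SoftwareDistribution", "SoftwareDistribution.old"],
    ["cmd", "/c", "ren", r"C:\Windows\System32\catroot2", "Catroot2.old"],
    ["net", "start", "wuauserv"],
    ["net", "start", "cryptSvc"],
    ["net", "start", "bits"],
    ["net", "start", "msiserver"],
]

def run_reset(steps=RESET_STEPS, dry_run=True):
    """Run each step in order, collecting a simple log; dry_run skips execution."""
    log = []
    for step in steps:
        if not dry_run:
            subprocess.run(step, check=True)  # stop on the first failing step
        log.append(" ".join(step))
    return log
```

Keeping the steps in a list like this also means the log can be written out afterwards, which makes it easier to see where things went wrong if a service refuses to stop.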

Using the Windows Command Line for Security Administration

24th July 2009

While there are point-and-click tools for the job, being able to set up new user groups, attach them to folders and assign users to them using the command line has major advantages when there are a number to be set up, and logs of execution can be retained too. In light of this, it seems a shame that terse documentation, along with the difficulty of tracking down answers to any questions using Google (or whatever happens to be your search engine of choice), makes it less easy to discern what commands need to be run. This is where a book would help, but the whole experience is in direct contrast to the community of information providers that is the Linux user community, with Ubuntu being a particular shining example. Saying that, the Windows help system is not so bad once you can track down what you need. For instance, once you know that you need commands like CACLS and NET LOCALGROUP, the ones that have been doing the hard work for me, it offers useful information quickly enough. To illustrate the usefulness of the aforementioned commands, here are a few scenarios.

Creating a new group:

net localgroup [name of new group] /comment:"[more verbose description of new group]" /add

Add a group to a folder:

cacls [folder address] /t /e /p [name of group]:[access letter]

The /t switch gets cacls to apply changes to the ACL for the specified folder and all its subfolders, recursive action in other words, while the /e specifies ACL editing rather than its replacement and /p induces replacement of permissions for a given user or group. Using :n, :f, :c or :r directly after the name of a specified user or group assigns no, full, change (write) or read access, respectively. Replacing /p with /r revokes access and leaving off the :n/:f/:c/:r will remove the group or user from the folder.

Add a user to a group:

net localgroup [name of group] [user name (with domain name if on a network)] /add

In addition to NET LOCALGROUP, there is also NET GROUP for wider network operations, something that I don’t have cause to do. Casting the net even wider, I suspect that VB scripting and its ability to tweak Windows Management Instrumentation might offer more functionality than what is above (PowerShell also comes to mind while we are on the subject), but I am sharing what has been helping me and it can be hard to find if you don’t know where to look.

A little more freedom

10th December 2011

A few weeks ago, I decided to address the fact that my Toshiba laptop had next to useless battery life. The arrival of an issue of PC Pro that included a review of lower-cost laptops was another spur and I ended up looking on the web to see what was in stock at nearby chain stores. In the end, I plumped for an HP Pavilion dm4 and it was Argos that supplied yet another piece of computing kit to me. In fact, they seem to have a wider range of laptops than PC World!

The Pavilion dm4 seems to come in two editions and I opted for the heavier of these, though it still is lighter than my Toshiba Equium, as I found on a recent trip away from home. Its battery life is a revelation for someone who has never got anything better than three hours from a netbook. Having more than five hours certainly makes it suitable for those longer train journeys away from home and I have seen remaining battery life quoted as exceeding seven hours from time to time, though I wouldn’t depend on that.

Of course, having longer battery life would be pointless if the machine didn’t do what else was asked of it. It comes with the 64-bit edition of Windows 7 and this taught me that this edition of the operating system also runs 32-bit software, a reassuring discovery. There’s a trial version of Office 2010 on there too and, having a licence key for the Home and Student edition, I fully activated it. Otherwise, I added a few extras to make myself at home, such as Dropbox and VirtuaWin (for virtual desktops as I would have in Linux). While I have played with the idea of adding Ubuntu using Wubi, I am not planning to set up dual booting of Windows and Linux like I have on the Toshiba. Little developments like this can wait.

Regarding the hardware, the CPU is an Intel Core i3 affair and there’s 4 GB of memory on board. The screen is a 14″ one and that makes for a more compact machine without making it too diminutive. The keyboard is of the scrabble-key variety and works well, as does the trackpad. There’s a fingerprint scanner for logging in and out without using a password but I haven’t got around to checking how this works so far. It all zips along without any delays and that’s all that anyone can ask of a computer.

There is one eccentricity in my eyes though: it seems that the function keys need to be used in combination with Fn for them to work like they would on a desktop machine. That makes functions like changing the brightness of the screen, adjusting the sound of the speakers and turning the WiFi on and off more accessible. My Asus Eee PC netbook and the Toshiba Equium both have things the other way around, so I found this state of affairs unusual, but it’s just a point to remember rather than a nuisance.

HP may have had its wobbles regarding its future in the PC-making business but the Pavilion feels well put together and very solidly built. It commanded a little premium over the others on my shortlist but it seems to have been worth it. If HP does go down the premium laptop route as has been reported recently, this is the kind of quality that they would need to deliver to justify higher prices. Saying that, is this the time to do such a thing with other devices challenging the PC’s place in consumer computing? It would be a shame to lose the likes of the Pavilion dm4 from the market to an act of folly.

A belated goodbye to PC Plus magazine

13th October 2012

Last year, Future Publishing made a loss so something had to be done to address that. Computer magazines such as Linux Format no longer could enclose their cover-mounted discs in elaborate cardboard wallets and moved to simpler sleeves instead. Another casualty has been one of their longest standing titles: PC Plus.

It has been around since 1986 and possibly was one of the publisher’s first titles. It was in the late nineties that I first encountered it and, for quite a few years afterwards, it was my primary computer magazine of choice every month. The mix of feature articles, reviews and tutorials covering a variety of aspects of personal computing was enough for me. After a while though, it became a bit stale and I stopped buying it regularly. Then, the collection that I had built up was dispatched to the recycling bin and I turned to other magazines.

In the late nineties, Future had a good number of computing titles on magazine shelves in newsagents and there did seem to be some overlap in content. For instance, we had PC Answers and PC Format alongside PC Plus at one point. Now, only PC Format is staying with us and its market seems to be home computer users such as those interested in PC gaming. .Net, initially a web usage title and now one focussing on website design and development, started from the same era, and Linux Format dates from around the turn of the century. Looking back, it looks like there was a lot of duplication going on in a heady time of expanding computer usage.

That expansion may have killed off PC Plus in the end. For me, it certainly meant that it no longer was a one-stop shop like Dennis’s PC Pro. For instance, the programming and web design content that used to come in PC Plus found itself appearing in .Net and in Linux Format. The appearance of the latter certainly meant that there was somewhere else for Linux content; for the record, my first dalliance with SuSE Linux was from a PC Plus cover-mounted disk. The specialisation and division certainly made PC Plus a less essential read than I once thought it.

Of course, we now have an economic downturn and major changes in the world of publishing alongside it. Digital publishing certainly is growing and this isn’t just about websites anymore. That probably explains in part Future’s recent financial performance. Then, when a title like PC Plus is seen as less important, it can cease to exist, but I reckon that it’s the earlier expansion that really did for it. If Future had one computing title that contained extensive reviews and plenty of computing tutorials with sections on programming and open source software, who knows what may have happened. Maybe consolidating the other magazines into that single title would have been an alternative, but my thinking is that it wouldn’t have been commercially realistic. Either way, the present might have been very different and PC Plus would be a magazine that I’d be reading every month. That isn’t the case of course and it’s sad to see it go from newsstands, even if the reality is that it left us quite a while ago.

When CRON is stalled by incorrect file and folder permissions

8th October 2021

During the past week, I rebooted my system only to find that a number of things no longer worked and my Pi-hole DNS server was among them. Having exhausted other possibilities by testing things out on another machine, I did a status check, spotted a line like the following in my system logs and went investigating further:

cron[322]: (root) INSECURE MODE (mode 0600 expected) (crontabs/root)

It turned out to be more significant than I had expected because this was why every CRON job was failing, and that included the network set-up needed by Pi-hole; a script is executed using the @reboot directive to accomplish this and I got Pi-hole working again by manually executing it. The evening before, I did make some changes to file permissions under /var/www but I was not expecting them to affect other parts of /var, though that may have something to do with some forgotten heavy-handedness. The cure was to issue a command like the following for execution in a terminal session:

sudo chmod -R 600 /var/spool/cron/crontabs/
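Before resorting to the blanket chmod, it can help to see which files are actually out of line. Below is a small Python sketch of such a check; it is my own addition rather than part of the fix, and the directory path is the Debian/Ubuntu default used here.

```python
# List crontab files whose permissions differ from the 0600 that cron
# expects; a quick diagnostic before (or after) applying a chmod.
import os
import stat

def insecure_crontabs(directory="/var/spool/cron/crontabs"):
    """Return (filename, octal mode) pairs for files not set to 0600."""
    offenders = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            mode = stat.S_IMODE(os.stat(path).st_mode)
            if mode != 0o600:
                offenders.append((name, oct(mode)))
    return offenders

# Only attempt the real directory when it exists and is readable;
# listing it normally needs root.
if os.path.isdir("/var/spool/cron/crontabs"):
    try:
        for name, mode in insecure_crontabs():
            print(f"{name}: {mode} (cron expects 0o600)")
    except PermissionError:
        pass
```

Running something like this under sudo would have pointed straight at the offending file instead of leaving the INSECURE MODE log line as the only clue.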

Then, CRON itself needed starting since it had not been running at all, and executing this command did the needful without restarting the system:

sudo systemctl start cron

That outcome was proved by executing the following command to issue some terminal output that included the welcome text “active (running)” highlighted in green:

sudo systemctl status cron

There was newly updated output from a frequently executing job that checks on web servers for me, but this was added confirmation. It was a simple cure for a perplexing situation that led me up all sorts of blind alleys before I alighted on the right solution to the problem.

A waiting game

20th August 2011

Having been away every weekend in July, I was looking forward to a quiet one at home to start August. However, there was a problem with one of my websites hosted by Fasthosts that was set to occupy me for the weekend and a few weekday evenings afterwards.

The issue appeared to be slow site response so I followed advice given to me by second line support when this website displayed the same type of behaviour: upgrade from Apache 1.3 to 2.2 using the control panel. Unfortunately for me, that didn’t work smoothly at all and there seemed to be serious file loss as a result. Raising a ticket with the support desk only got me the answer that I had to wait for completion and I now have come to the conclusion that the migration process may have got stuck somewhere along the way. Maybe another ticket is in order.

There were a number of causes of the waiting that gave rise to the title of this post. Firstly, support for low-cost hosting isn’t exactly timely and I do wonder if it’s any better for more prominent websites. Restoration of websites by FTP is another activity that takes up plenty of time, as does rebuilding databases and populating them with data. Lastly, there’s changing the DNS details for a website. In hindsight, there may be ways of reducing the time demands of these. For instance, contacting a support team by telephone may be quicker unless there is a massive queue awaiting attention; there was a wait of several hours one night when a security changeover affected a multitude of Fasthosts users. Of course, it is not a panacea at the best of times, as we have known since all those stories began to do the rounds in the middle of the 1990s. Doing regular backups would help with the second, though the ones that I was using for the restoration weren’t too bad at all. Nevertheless, they weren’t complete, so there was unfinished business that required resolution later. The last of these is helped along by more regular PC restarts, so that unexpected discovery will remain a lesson for the future, though I don’t plan on moving websites around for a while. After all, getting DNS details propagated more quickly really is a big help.

While awaiting a response from Fasthosts, I began to ponder the idea of using an alternative provider. Perusal of the latest digital edition of .Net (I now subscribe to the non-paper edition so as to cut down on the clutter caused by having paper copies about the place) ensued before I decided to investigate the option of using Webfusion. Having decided to stick with shared hosting, I gave their Unlimited Linux option a go. For someone accustomed to monthly billing, it was unusual to see annual, biennial and triennial payment schemes too. The first of these appears to be the default option, so a little care and attention is needed if you want something else. In order to encourage you to stay with Webfusion longer, the per-month cost is on a sliding scale: the longer the period you buy, the lower the cost of a month’s hosting.

Once the account was set up, I added a database and set about the long process of uploading files from my local development site using FileZilla. Having got a MySQL backup from the Fasthosts site, I used the provided PHPMyAdmin interface to upload the data in pieces not exceeding the 8 MB file size limitation. It isn’t possible to connect remotely to the MySQL server using the likes of MySQL Administrator, so I had to bear with this not-so-smooth process. SSH is another connection option that isn’t available, but I never used it much on Fasthosts sites anyway. There were some questions to the support people along the way; the first of these got a timely answer, though later ones took longer. Still, getting advice on the address of the test website was a big help while I was sorting out the DNS changeover.
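Splitting a dump into pieces for that kind of size-capped upload can be done mechanically rather than by hand. Here is a Python sketch of the idea, an illustration of my own rather than anything PHPMyAdmin provides: it packs whole SQL statements into chunks so that each piece stays importable on its own.

```python
# Split the lines of a SQL dump into chunks no larger than max_bytes,
# breaking only after a statement terminator (";") so every chunk can be
# imported independently. A single statement larger than max_bytes still
# becomes one oversized chunk, since it cannot be broken safely.

def split_dump(lines, max_bytes=8 * 1024 * 1024):
    """Pack whole SQL statements into chunks, each at most max_bytes where possible."""
    chunks, current, size = [], [], 0
    statement, stmt_size = [], 0
    for line in lines:
        statement.append(line)
        stmt_size += len(line.encode("utf-8"))
        if line.rstrip().endswith(";"):
            # Statement complete: flush the chunk first if adding it would overflow.
            if current and size + stmt_size > max_bytes:
                chunks.append("".join(current))
                current, size = [], 0
            current.extend(statement)
            size += stmt_size
            statement, stmt_size = [], 0
    current.extend(statement)  # keep any trailing lines without a terminator
    if current:
        chunks.append("".join(current))
    return chunks
```

Each returned chunk could then be written to its own file and uploaded in turn, rather than cutting the dump by eye in a text editor.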

Speaking of the DNS changeover, it took a little doing and no little poking around Webfusion’s FAQs before I made it happen. First, I tried using name servers that I found listed in one of the articles but this didn’t seem to achieve the end that I needed. Mind you, I would have seen the effects of this change a little earlier if I had rebooted my PC sooner than I did, but it didn’t occur to me at the time. In the end, I switched to using my domain provider’s name servers and added the required information to them to get things going. It was then that my website was back online in some fashion, so I could tidy up any outstanding loose ends.

With the site essentially operating again, it was time to iron out the rough edges. The biggest of these was that MOD_REWRITE doesn’t seem to work the same on the Webfusion server as it does on the Fasthosts ones. This meant that I needed to use the SCRIPT_URI CGI variable instead of PATH_INFO in order to keep using clean URLs for a PHP-powered photo gallery that I have. It took me a while to figure that out and I felt much better when I managed to get the results that I needed. However, I also took the chance to tidy up site addresses with redirections in my .htaccess file in an attempt to ensure that I lost no regular readers, something that I seem to have achieved with some success because one such visitor later commented on a new entry in the outdoors blog.
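As an illustration of the kind of .htaccess arrangement involved, a sketch like the following could cover both the clean URLs and the tidy-up redirections. The rule, file names and addresses here are hypothetical rather than what the site actually uses:

```apache
RewriteEngine On

# Send clean gallery addresses to a single PHP script; on a host where
# PATH_INFO is unreliable, the script reads the requested path from the
# SCRIPT_URI CGI variable instead.
RewriteRule ^gallery/ gallery.php [L]

# Permanently redirect a retired address to its replacement so that
# regular readers and search engines are not lost.
Redirect 301 /old-page.php http://www.example.com/new-page.php
```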

Once any remaining missing images were instated or references to them removed, it was then time to do a full backup for the sake of safety. The first of these activities was yet another time consumer while the second didn’t take so long; I need to do this more often in case anything happens. Hopefully though, the relocated site’s performance continues to be as solid as it is now.

The question as to what to do with the Fasthosts webspace remains outstanding. Currently, they are offering free upgrades to existing hosting packages so long as you commit for a year. After my recent experience, I cannot say that I’m so sure about doing that kind of thing. In fact, the observation leaves me wondering if instating that very extension was the cause of breaking my site in the first place. All in all, what happened to that Fasthosts website wasn’t the greatest of experiences, but the service offered by Webfusion has been rock solid thus far. While wondering if the service from Fasthosts isn’t as good as it once was, I’ll keep an open mind and wait to see if my impressions change over time.

Installing VMware Player 4.04 on Linux Mint 13

15th July 2012

Curiosity about the Release Preview of Windows 8 saw me running into bother when trying to see what it’s like in a VirtualBox VM. While doing some investigations on the web, I saw VMware Player being suggested as an alternative. Before discovering VirtualBox, I had a licence for VMware Workstation and was interested in seeing what Player would have to offer. Then, it was limited to running virtual machines that were created using Workstation. Now, it can create and manage them itself, without any need to pay for the tool either. Registration on VMware’s website is a must for downloading it, though that incurs no monetary cost.

Once I had downloaded Player from the website, I needed to install it on my machine. There are Linux and Windows versions, and it was the former that I needed; there are also 32-bit and 64-bit variants, so you need to know what your system is running. With the file downloaded, you need to set it as executable and the following command should do the trick once you are in the right directory:

chmod +x VMware-Player-4.0.4-744019.i386.bundle

Then, it needs execution as a superuser. With sudo access for my user account, it was a matter of issuing the following command and working through the installation screens to instate the Player software on the system:

sudo ./VMware-Player-4.0.4-744019.i386.bundle

Those screens proved easy for me to follow, so life would have been good if that were all that was needed to get Player working on my PC. Having Linux Mint 13 means that the kernel is of the 3.2 stock and that means using a patch to finish off the Player installation because the required VMware kernel modules seem to fail silently to compile during the installation process. This only manifests itself when you attempt to start Player afterwards, only to find a module installation screen appearing. That wouldn’t be an issue in itself were it not for the compilation failure of the vmnet module and the subsequent inability to start VMware services on the machine. There is a prompt to peer into the log file for the operation and that is a little uninformative for the non-specialist.

Rummaging around the web brought me to the requisite patch, which works for Player 4.0.3 and Workstation 8.0.2 by default. Doing some tweaking allowed me to make it work for Player 4.0.4 too. My first step was to extract the contents of the tarball to /tmp where I could edit patch-modules_3.2.0.sh. Line 8 was changed to the following:

plreqver=4.0.4
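For anyone repeating this against another minor release, the one-line edit can be scripted instead of done by hand. A small Python sketch of my own follows, assuming the stock patch script contains a line beginning with plreqver= (the path in the example is hypothetical):

```python
# Point patch-modules_3.2.0.sh at a different VMware Player version by
# rewriting its plreqver= line; this helper is an illustration, not part
# of the patch itself.
import re
from pathlib import Path

def set_player_version(script_path, version):
    """Rewrite the plreqver= line in the patch script to the given version."""
    path = Path(script_path)
    text = path.read_text()
    new_text = re.sub(r"^plreqver=.*$", f"plreqver={version}", text, flags=re.M)
    path.write_text(new_text)
    return new_text

# Example (hypothetical location of the extracted tarball):
# set_player_version("/tmp/vmware-patch/patch-modules_3.2.0.sh", "4.0.4")
```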

With the amendment saved, it was time to execute the shell script as a superuser, having made it executable beforehand. This can be accomplished using the following command:

chmod +x patch-modules_3.2.0.sh && sudo ./patch-modules_3.2.0.sh

With that completed successfully, VMware Player ran as it should. An installation of Windows 8 into a new VM ran very smoothly and I was impressed with the performance and responsiveness of the operating system within a Player VM. There are a few caveats though. First, it doesn’t run at all well with VMware Tools, so it’s best to leave them uninstalled; it doesn’t seem to need them either, since it was possible to set the resolution to the same as my screen and use the CTRL+ALT+ENTER shortcut to drop in and out of full screen mode anyway. Second, the unattended Windows installation wasn’t the way forward for setting up the VM, but it was no big deal to have that experiment thwarted. The feature remains an interesting one though.

With Windows 8 running so well in Player, I was reminded of the sluggish nature of my Windows 7 VM and an issue with a Fedora 17 one too. The result was that I migrated the Windows 7 VM from VirtualBox to VMware and all is so much more responsive. Getting it there took not a little tinkering so that’s a story for another entry. On the basis of my experiences so far, I reckon that VMware Player will remain useful to me for a little while yet. Resolving the installation difficulty was worth that extra effort.

So you just need a web browser?

21st November 2009

When Google announced that it was working on an operating system, it was bound to result in a frisson of excitement. However, a peek at the preview edition that has been doing the rounds confirms that Chrome OS is a very different beast from those operating systems to which we are accustomed. The first thing that you notice is that it only starts up the Chrome web browser. In this, it is like a Windows terminal server session that opens just one application. Of course, in Google’s case, that one piece of software is the gateway to its usual collection of productivity software like GMail, Calendar, Docs & Spreadsheets and more. Then, there are offerings from others too with Microsoft just beginning to come into the fray to join Adobe and many more. As far as I can tell, all files are stored remotely and I reckon that adding the possibility of local storage and management of those local files would be a useful enhancement.

With Chrome OS, Google’s general strategy starts to make sense. First create a raft of web applications, follow them up with a browser and then knock up an operating system. It just goes to show that Google Labs doesn’t just churn out stuff for fun but that there is a serious point to their endeavours. In fact, you could say that they sucked us in to a point along the way. Speaking for myself, I may not entrust all of my files to storage in the cloud but I am perfectly happy to entrust all of my personal email activity to GMail. It’s the widespread availability and platform independence that has done it for me. For others spread between one place and another, the attractions of Google’s other web apps cannot be overstated. Maybe that’s why they are not the only players in the field either.

With the rise of mobile computing, that portability is the opportunity that Google is trying to use to its advantage. For example, mobile phones are being used for things now that would have been unthinkable a few years back. Then, there’s the netbook revolution started by Asus with its Eee PC. All of this is creating an ever more internet-connected bunch of people, so having devices that connect straight to the web, as they would with Chrome OS, has to be a smart move. Some may decry the idea that Chrome OS is going to be available on a device-only basis, but I suppose Google has to make money from this too; search can only pay for so much, and the company has experience with Android as well.

There have been some who wondered about Google’s activities killing off Linux and giving Windows a good run for its money; Chrome OS seems to be a very different animal to either of these. It looks like a tool for those on the move, an appliance rather than the pure multipurpose tool that an operating system usually is. If there is a symbol of what an operating system means for me, it’s the ability to start with a bare desktop and decide what to do next. Transparency is another plus point, and the Linux command line has that in spades. For those who view PCs purely as a means to get things done, such interests are peripheral, and it is for these users that the likes of Chrome OS has been created. In other words, the Linux community needs to keep an eye on what Google is doing, but should not take fright, because there are other things that Linux always will have as unique selling points. The same sort of thing applies to Windows too, but Microsoft’s near stranglehold on the enterprise market will take a lot of loosening, perhaps keeping Chrome OS in the consumer arena. Counterpoints to that include the use of GMail for enterprise email by some companies and the increasing footprint of web-based applications, even bespoke ones, in business computing. In fact, it’s the latter that can be blamed for any tardiness in Internet Explorer development. In summary, Chrome OS is a new type of thing rather than a replacement for what’s already there. We may find that co-existence is how things turn out, but what it means for Linux in the netbook market is another matter. Only time will tell on that one.

Debian & Derivatives

21st September 2012

Debian is one of the oldest Linux distributions and has spawned many derivatives, with Ubuntu being the most notable of the lot. It also has a range of ports, including one that uses a BSD kernel (GNU/kFreeBSD). Mainly, though, it is the x86 and AMD64 Linux variants that get the most attention.

I do have something of a soft spot for Debian, mainly because it was loaded on a backup machine that was pressed into service when my main home system went belly up on me in 2009. It may attract its aficionados (and there is an administrator’s manual, which gives you an idea of who gets attracted to the OS), but that does nothing to detract from its usability in my experience. Well, Ubuntu did start from a good base, after all.

That was not to be the end of my dalliance with Debian, and I still have virtual machines loaded with it today. The fact that new versions of the operating system do not come around very frequently can be an attraction, even if that is lost on those who always want the latest software. When it comes to GNOME Shell, keeping the same version for longer than six months, without wondering whether favoured extensions will be updated for a new release, has something going for it. Long-term support helps too, a growing trend in the world of Linux.

The changes introduced with GNOME 3 have been contentious, and the Debian team has toyed with using other default desktop environments, yet always returned to GNOME regardless. Still, the range of desktop environments that you can use with Debian has expanded, with both Cinnamon and KDE being options that come to mind, and there are others.

As you might tell, I do have a soft spot for Debian, and its focus on stability is at the heart of that. Maybe that is why it has so many variants, like Ubuntu and Linux Mint, to name just two. Whenever a new version does appear, it may not have the latest versions of software, but there are times when experimentation needs to be tamed, and it is good to know that upheaval is hardly a regular occurrence either.

More Options Based on Debian

So many other distros are based on Debian that there needs to be a list of them on here. Ubuntu and Linux Mint are the most notable of the lot, but there are many others, as you will see below. Some may fall into more specific functional listings that you can find via the sidebar.

Bodhi

An up-and-coming Ubuntu derivative that uses the Enlightenment desktop environment.

Deepin

This is another derivative of Ubuntu that is gaining favour thanks to the elegance of its desktop. That it is essentially GNOME 3 says something about how far GNOME Shell can be customised too.

Devuan

When Debian changed from sysvinit to systemd for managing system start-up and services, there were those who disagreed strongly with the decision. Though the Debian team did vote for this under-the-bonnet change, the detractors set up Devuan as an alternative downstream project that allows them to continue as they were.

Elementary OS

It has Ubuntu at its heart, but a lot of work has happened to make it feel as if that isn’t the case.

ExTix

What you have here is a Swedish respin of Deepin Linux. From the website, it appears that freedom is a concern, but more could be made of the reasons for doing what they are doing.

Finnix

This is not a full desktop option; rather, it contains many system utilities for maintenance and recovery. What you get on startup is a root command line with everything available to you.

Freespire

If you can forego the support that Linspire offers its customers, then this can come to you free of charge. The basis here is Ubuntu with different choices like the inclusion of Flatpak as well as a different software selection that includes the Brave browser and OnlyOffice.

Grml

This is a remix of Debian, using the Zsh shell, that runs exclusively as a live distro, either from a CD or a USB flash drive.

Knoppix

If I recall correctly, this was the first-ever distribution to offer a Live CD version of itself and the innovation has taken off to the level that almost all of its competitors now offer the same. Its creator also writes a helpdesk column for Linux Magazine.

Kubuntu

Until the 12.04 release, this was sponsored by Canonical, but that changed with Blue Systems taking over for 12.10. It remains the KDE flavour of Ubuntu despite this, and that seems set to remain the case for the foreseeable future.

Linux Lite

As the name suggests, this Ubuntu variant is suitable for older computer hardware. Also, it is based on LTS releases of Ubuntu, so there is no need to upgrade every six months either.

Linux Mint

The main distro may be based on Ubuntu, but there is a Debian-based version, LMDE, too. The latter only comes with the Cinnamon desktop environment while the former comes with Cinnamon, MATE and Xfce. My everyday choice has been the Cinnamon edition based on Ubuntu even if the Debian version has been checked out for a time as well; LMDE felt a little clunkier to me so I am staying mainstream for my purposes. All in all, Linux Mint feels far more community-oriented with less drama, which is why it gets my vote for everyday computing.

Linuxfx

One of the promises here is the running of Windows applications using Wine along with the running of Android apps. Also, the chosen desktop environment is KDE Plasma.

Lubuntu

The first place I ever tried Lubuntu was on a now elderly Asus Eee PC netbook. LXDE is the desktop environment of choice here, and it is very lightweight, so it fits the bill for netbooks and PCs that are getting on in years. The included software is chosen for being lightweight too, so Chromium appeared instead of Firefox, but the accessibility of the Ubuntu repositories meant that LibreOffice and the aforementioned Firefox never took long to appear wherever I installed Lubuntu. Originally, it was an independent project, but it impressed Mark Shuttleworth enough to gain official support, such that new versions now appear on the same day as the main Ubuntu release itself.

LXLE

The website for this project disappeared for a while, but it seems to be back again, so the entry reappears in this list. It is yet another lightweight distro for use on an ageing computer, as if Linux does not provide enough of these already. However, each has its own aesthetics, so that may have something to do with the number of available options.

MX

In the first decade of the century, Warren Woodford created a distro called Mepis, but that project was discontinued in 2009. In response, members of the antiX and Mepis teams came together to create MX as a successor to Mepis. Today, the project remains active, and the latest version comes with XFCE, KDE and Fluxbox desktop environment choices. The antiX involvement adds a little extra computing efficiency too.

Netrunner

While Kubuntu existed, the need for this was lost on me, but the project’s continued existence will serve those left without an option should the official Ubuntu derivative fall by the wayside. The effort is sponsored by Blue Systems.

Nitrux

Here is one of the strap lines for Nitrux: Powered by Debian, KDE Plasma and Frameworks, and AppImages. The last on the list refers to an ongoing trend for packaging applications within containers for desktop usage. All you need to do is drop the AppImage file somewhere, make it executable and run that.
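The drop-somewhere-and-run workflow just described can be sketched in a few shell commands. Here, a stand-in script plays the part of a downloaded AppImage (the file name is hypothetical); the point is that any such file runs in place once it is marked executable, with no installation step.

```shell
# Create a stand-in for a downloaded AppImage (hypothetical name, for illustration only).
printf '#!/bin/sh\necho "application started"\n' > /tmp/SomeApp.AppImage

# Make the file executable, then run it directly from wherever it was dropped.
chmod +x /tmp/SomeApp.AppImage
/tmp/SomeApp.AppImage
```

With a real AppImage, the first step is simply replaced by saving the downloaded file wherever suits you.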

Pardus

There was a time when this Turkish distro made something of a splash, but those days are gone and I even thought the project was moribund only to get corrected. As it happens, both GNOME and XFCE desktop environments are offered for your choosing.

Peppermint OS

Both Debian and Devuan form the basis for spins of this distro. XFCE is the chosen desktop environment so that should be more than usable for most.

Pop!_OS

If you buy a computer from System76, then Pop!_OS is the operating system that you get with it, since the project is orchestrated by the same company. You can download installation media for other computers too, and the target audience includes those working in the science, technology, engineering and mathematics sectors, as well as content producers. There is a bespoke desktop environment called Cosmic in place of more commonplace options.

Q4OS

Prague appears to be the development HQ for this distro these days. For desktop environments, it has KDE but also a unique choice in the less well-known Trinity, and it has dual desktop capability. Another interesting feature is the way it runs alongside Windows. It also runs on ARM as well as x86.

Siduction

This is a packaging of software from Debian’s unstable branch, which is always called Sid and so is the inspiration for the name of this distro. There are quarterly releases, and five desktop environments are on offer: GNOME, LXDE, XFCE, KDE SC and Razor-qt. For whatever reason, there is a version with no desktop environment at all, but that might be for the sort of DIY enthusiast who enjoys the likes of Arch.

SparkyLinux

Using the testing branch of Debian, this rolling release distro comes in E17, LXDE, MATE and Razor-qt flavours. There’s also a command-line edition for those wanting to build their desktop environment instead of having it pre-packaged for them.

SpiralLinux

What you have here is a respin of Debian that uses its software repositories directly while adding a dash of user-friendliness. It probably is for those who want to stay closer to the Debian base than Ubuntu does, yet a recent magazine review commented that Ubuntu does user-friendliness better anyway. Even so, Debian does not offer the live DVD/USB images that you get here.

Ubuntu

It was Ubuntu that steered me into the world of full-time Linux usage after a series of Windows XP meltdowns. In contrast to earlier dalliances with Linux, all of my hardware was supported without any bother and everything seemed to work straight away. Whatever issues I faced in those early months, there seemed to be an answer in an Ubuntu forum or blog for my problem, even if some needed a spot of thought when it came to their implementation.

Ubuntu Budgie

Budgie may be an upstart desktop environment, but that has not stopped an official Ubuntu spin from using it. Things look swish so it will be interesting to watch this.

Ubuntu MATE

In a sense, this is going back to how Ubuntu was before the arrival of GNOME Shell or Unity, both of which caused controversy, and it is a community effort rather than one sponsored by Canonical. With Linux Mint having the MATE desktop too, you might be tempted to ask what this offers, but the decision by the Linux Mint team to go exclusively for a long-term support model answers that. In contrast, the next release of Ubuntu MATE will be 14.10, so you get intermediate releases this way, and in situ distro version upgrades should be a possibility too, another practice that the Linux Mint team reckons is undesirable. It will be interesting to see how many go for this.

Ubuntu Studio

This is a spin of Ubuntu for content creators. Here, the focus is on audio, graphics, video and photography. The main desktop is KDE, but you also can add the Ubuntu Studio experience to other flavours of Ubuntu, increasing the choice of desktop environment.

Voyager

This is a French project with variants based on Debian and on Ubuntu. The website has sections about gaming and ChatGPT, among other things. For English speakers, text comes up in French before converting to English; patience is needed to avoid confusion.

Xubuntu

This is a variant of Ubuntu using the Xfce desktop environment. As such, that makes it a bit lighter on computer power than the main distro would be. Having tried it a few times on various machines, it remains very usable and has a more conventional user interface too.

Zentyal

From the website, this would appear to be a mail server operating system that has a user-friendly feel to it. However, Linux Magazine has left me with the impression that its talents go beyond this and that activities like serving websites are supported. These are things that I have yet to explore with the VirtualBox instance that I have set up to see what it can do.

Zorin

This distro mocks up its desktop environments to ape those of Windows and macOS, and that is its major selling point. That’s not all, since they are selling laptops with the OS installed on them too. Additionally, enterprise management services are another product line here.
